Computing OWA weights as relevance factors

Authors

  • Angel Caţaron
  • Răzvan Andonie
Abstract

* On leave of absence from the Department of Electronics and Computers, Transylvania University of Braşov.

Ordered Weighted Aggregation (OWA) operators form a distinct family of aggregation operators and were introduced by Yager in [1]. They compute a weighted sum of a number of criteria that must be satisfied. The central feature of OWA operators is that the criteria are reordered before aggregation, so each weight is associated with a position rather than with a particular criterion. Relevance Learning Vector Quantization (RLVQ) [2] is an extension of the Learning Vector Quantization (LVQ) algorithm [3] that performs a heuristic determination of the relevance factors of the input dimensions. The method is based on Hebbian learning and associates a weight factor with each dimension of the input vectors. We present an LVQ method for on-line computation of the OWA weights as relevance factors. The method uses a weighted metric based on the OWA restrictions. The principal benefit of our algorithm is that it connects two distinct topics: the RLVQ algorithm and the consistent mathematical model of the OWA operators.
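For illustration only (this sketch is not part of the paper), the following Python fragment shows the aggregation step described above: the criterion scores are sorted in descending order and then combined by a dot product with a weight vector satisfying the OWA restrictions, i.e. non-negative weights that sum to one. All numerical values are hypothetical.

```python
import numpy as np

def owa(scores, weights):
    """Ordered Weighted Aggregation: sort the criterion scores in descending
    order, then take the weighted sum with the position weights."""
    w = np.asarray(weights, dtype=float)
    # OWA restrictions: non-negative weights that sum to 1.
    assert np.all(w >= 0) and np.isclose(w.sum(), 1.0)
    ordered = np.sort(np.asarray(scores, dtype=float))[::-1]  # descending order
    return float(np.dot(w, ordered))

# Hypothetical example with three criteria; the weights emphasize the best score:
# sorted scores are (0.9, 0.6, 0.4), so the result is 0.5*0.9 + 0.3*0.6 + 0.2*0.4 = 0.71.
print(owa([0.4, 0.9, 0.6], [0.5, 0.3, 0.2]))
```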


Similar articles

RLVQ determination using OWA operators

Relevance Learning Vector Quantization (RLVQ) (introduced in [1]) is a variation of Learning Vector Quantization (LVQ) which allows a heuristic determination of relevance factors for the input dimensions. The method is based on Hebbian learning and defines weighting factors of the input dimensions which are automatically adapted to the specific problem. These relevance factors increase the over...
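The fragment below is a minimal sketch of the kind of relevance-weighted metric such methods rely on, together with a projection of the relevance vector onto the OWA restrictions (non-negative entries summing to one). The Hebbian-style adjustment shown is a generic illustration, not the exact update rule of the cited RLVQ paper.

```python
import numpy as np

def relevance_weighted_distance(x, prototype, relevances):
    """Squared distance in which each input dimension is scaled by its
    relevance factor (a weighted metric)."""
    d = np.asarray(x, dtype=float) - np.asarray(prototype, dtype=float)
    return float(np.dot(np.asarray(relevances, dtype=float), d * d))

def project_to_owa_constraints(r):
    """Clip negative relevances and renormalize so the factors sum to one."""
    r = np.clip(np.asarray(r, dtype=float), 0.0, None)
    return r / r.sum() if r.sum() > 0 else np.full_like(r, 1.0 / r.size)

def hebbian_relevance_step(relevances, x, winner, correct, alpha=0.01):
    """Generic Hebbian-style step: adjust each relevance factor in proportion
    to the per-dimension disagreement with the winning prototype, with the
    sign chosen by whether the classification was correct, then project back
    onto the constraints. Illustrative only; the actual RLVQ update rule is
    given in the cited paper."""
    diff = np.abs(np.asarray(x, dtype=float) - np.asarray(winner, dtype=float))
    step = -alpha * diff if correct else alpha * diff
    return project_to_owa_constraints(np.asarray(relevances, dtype=float) + step)
```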

On the Least Squared Ordered Weighted Averaging (LSOWA) Operator Weights

The ordered weighted averaging (OWA) operator introduced by Yager has received increasing attention since its appearance. One key point in the OWA operator is determining its associated weights. Among the numerous methods that have appeared in the literature, we note the maximum entropy OWA (MEOWA) weights, which are determined by taking into account two appealing measures characterizing the OWA weights. ...
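As background for the excerpt above (standard definitions, not quoted from it): an OWA weight vector W = (w_1, ..., w_n) is commonly characterized by its degree of orness and its dispersion (entropy), and the MEOWA weights maximize the dispersion subject to a prescribed orness level:

```latex
\mathrm{orness}(W) = \frac{1}{n-1}\sum_{i=1}^{n}(n-i)\,w_i,
\qquad
\mathrm{disp}(W) = -\sum_{i=1}^{n} w_i \ln w_i,
\qquad
\sum_{i=1}^{n} w_i = 1,\quad w_i \ge 0 .
```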

Type-1 OWA operators for aggregating uncertain information with uncertain weights induced by type-2 linguistic quantifiers

The OWA operator proposed by Yager has been widely used to aggregate experts’ opinions or preferences in human decision making. Yager’s traditional OWA operator focuses exclusively on the aggregation of crisp numbers. However, experts usually tend to express their opinions or preferences in a very natural way via linguistic terms. These linguistic terms can be modelled or expressed by (type-1) ...

Parametric Extension of the Most Preferred OWA Operator and Its Application in Search Engine's Rank

The most preferred ordered weighted average (MP-OWA) operator is a new kind of neat (dynamic-weight) OWA operator in the family of aggregation operators. It considers the preferences of all alternatives across the criteria and provides unique aggregation characteristics in decision making. In this paper, we propose the parametric form of the MP-OWA operator to deal with the uncertain preference inf...

Choosing OWA operator weights in the field of Social Choice

One of the most important issues in the theory of OWA operators is the determination of associated weights. This matter is essential in order to use the best-suited OWA operator in each aggregation process. Given that some aggregation processes can be seen as extensions of majority rules to the field of gradual preferences, it is possible to determine the OWA operator weights by taking into acc...


Journal:

Volume   Issue

Pages  -

Publication date: 2004